189 research outputs found

    Spectral diffusion and 14N quadrupole splittings in absorption detected magnetic resonance hole burning spectra of photosynthetic reaction centers

    Zero-field absorption-detected magnetic resonance hole burning measurements were performed on photosynthetic reaction centers of the bacteria Rhodobacter sphaeroides R26 and Rhodopseudomonas viridis. Extrapolation to zero microwave power yielded pseudohomogeneous linewidths of 2.0 MHz for Rhodopseudomonas viridis, 1.0 and 0.9 MHz for the protonated forms of Rhodobacter sphaeroides R26 with and without monomer bacteriochlorophyll exchanged, and 0.25 MHz as an upper limit for fully deuterated reaction centers of Rhodobacter sphaeroides R26. The measured linewidths were interpreted as being due to unresolved hyperfine interaction between the nuclear spins and the triplet electron spin, the line shape being determined by spectral diffusion among the nuclei. The difference in linewidths between Rhodobacter sphaeroides R26 and Rhodopseudomonas viridis is then explained by triplet delocalization over the special pair in the former, and localization on one dimer half in the latter. In the fully deuterated sample, four quadrupole satellites were observed in the hole spectra, arising from the eight 14N nuclei in the special pair. The quadrupole parameters seem to be very similar for all nitrogens and were determined to be 1.25±0.1 MHz and 0.9±0.1 MHz. The Journal of Chemical Physics is copyrighted by The American Institute of Physics.
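
As a side note on the extrapolation step mentioned above: the zero-power ("pseudohomogeneous") linewidth is the intercept of a fit of measured hole width against microwave power. A minimal numerical sketch, with made-up numbers rather than the paper's data, and assuming simple linear power broadening:

```python
import numpy as np

# Hypothetical hole widths (MHz) measured at several relative microwave powers.
# At low power, power broadening is roughly linear, so the intercept of a
# straight-line fit estimates the pseudohomogeneous linewidth.
power = np.array([1.0, 2.0, 3.0, 4.0])   # relative microwave power (a.u.)
width = np.array([1.4, 1.8, 2.2, 2.6])   # measured hole widths (MHz)

slope, intercept = np.polyfit(power, width, 1)
print(f"zero-power linewidth ~ {intercept:.2f} MHz")
```

With these illustrative numbers the fit extrapolates to a 1.0 MHz linewidth at zero power.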

    Refocusing generalised normalisation

    When defined with general elimination/application rules, natural deduction and λ-calculus become closer to sequent calculus. In order to get a real isomorphism, normalisation has to be defined in a "multiary" variant, in which reduction rules are necessarily non-local (the reason: normalisation, like cut-elimination, acts at the head of applicative terms, whereas natural deduction focuses on the tail of such terms). Non-local rules are bad, for instance, for the mechanization of the system. A solution is to extend natural deduction even further, to a unified calculus based on the unification of cut and general elimination. In the unified calculus, a sequent term behaves as in the sequent calculus, whereas the reduction steps of a natural deduction term are interleaved with explicit steps for bringing heads into focus. A variant of the calculus has the symmetric role of improving sequent calculus in dealing with tail-active permutative conversions.
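
The head/tail contrast the abstract appeals to can be made concrete with a toy head-reduction for plain λ-terms (my own illustrative encoding, not the paper's multiary calculus; the substitution below is naive, not capture-avoiding):

```python
from dataclasses import dataclass

# A tiny lambda-term syntax: in a spine ((f a) b) c the next redex sits at the
# head f, which this tail-biased (left-nested) syntax only reaches by walking
# down the whole spine -- the non-locality the abstract describes.
@dataclass
class Var: name: str
@dataclass
class Lam: param: str; body: object
@dataclass
class App: fun: object; arg: object

def subst(t, x, s):
    """Naive substitution of s for x in t (no capture avoidance)."""
    if isinstance(t, Var):
        return s if t.name == x else t
    if isinstance(t, Lam):
        return t if t.param == x else Lam(t.param, subst(t.body, x, s))
    return App(subst(t.fun, x, s), subst(t.arg, x, s))

def head_step(t):
    """One head-reduction step, or None if the term is head-normal."""
    if isinstance(t, App):
        if isinstance(t.fun, Lam):                 # redex found at the head
            return subst(t.fun.body, t.fun.param, t.arg)
        reduced = head_step(t.fun)                 # otherwise walk the spine
        return App(reduced, t.arg) if reduced else None
    return None

# (\x. x) y  ->  y
term = App(Lam("x", Var("x")), Var("y"))
print(head_step(term))  # Var(name='y')
```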

    Facts, Values and Quanta

    Quantum mechanics is a fundamentally probabilistic theory (at least so far as its empirical predictions are concerned). It follows that, if one wants to understand quantum mechanics properly, it is essential to understand clearly the meaning of probability statements. The interpretation of probability has excited nearly as much philosophical controversy as the interpretation of quantum mechanics. Twentieth-century physicists have mostly adopted a frequentist conception. In this paper it is argued that we ought, instead, to adopt a logical or Bayesian conception. The paper includes a comparison of the orthodox and Bayesian theories of statistical inference. It concludes with a few remarks concerning the implications for the concept of physical reality. Comment: 30 pages, AMS LaTeX.
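
For readers unfamiliar with the Bayesian conception the paper defends, a one-step Bayesian update looks like this (a minimal sketch with illustrative numbers, not an example taken from the paper):

```python
from fractions import Fraction

# A uniform Beta(1, 1) prior over a coin's bias, updated on 7 heads in
# 10 tosses, becomes a Beta(8, 4) posterior: degrees of belief revised by
# data, rather than probabilities read off from long-run frequencies.
alpha, beta = 1, 1          # uniform prior
heads, tails = 7, 3         # observed data
alpha += heads
beta += tails
posterior_mean = Fraction(alpha, alpha + beta)
print(posterior_mean)       # 2/3
```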

    An Alternative Interpretation of Statistical Mechanics

    In this paper I propose an interpretation of classical statistical mechanics that centers on taking seriously the idea that probability measures represent complete states of statistical mechanical systems. I show how this leads naturally to the idea that the stochasticity of statistical mechanics is associated directly with the observables of the theory rather than with the microstates (as traditional accounts would have it). The usual assumption that microstates are representationally significant in the theory is therefore dispensable, a consequence which suggests interesting possibilities for developing non-equilibrium statistical mechanics and investigating inter-theoretic answers to the foundational questions of statistical mechanics.

    “A very orderly retreat”: Democratic transition in East Germany, 1989-90

    East Germany's 1989-90 democratisation is among the best known of East European transitions, but does not lend itself to comparative analysis, due to the singular way in which political reform and democratic consolidation were subsumed by Germany's unification process. Yet aspects of East Germany's democratisation have proved amenable to comparative approaches. This article reviews the comparative literature that refers to East Germany, and finds a schism between those who designate East Germany's transition "regime collapse" and others who contend that it exemplifies "transition through extrication". It inquires into the merits of each position and finds in favour of the latter. Drawing on primary and secondary literature, as well as archival and interview sources, it portrays a communist elite that was, to a large extent, prepared to adapt to changing circumstances and capable of learning from "reference states" such as Poland. Although East Germany was the Soviet-bloc state in which the positions of existing elites were most threatened by democratic transition, here too a surprising number succeeded in maintaining their position while filing across the bridge to market society. A concluding section outlines the alchemy through which their bureaucratic power was transmuted into property and influence in the "new Germany".

    Towards a canonical classical natural deduction system

    This paper studies a new classical natural deduction system, presented as a typed calculus named \lml. It is designed to be isomorphic to Curien-Herbelin's calculus, both at the level of proofs and of reduction, and the isomorphism is based on the correct correspondence between cut (resp. left-introduction) in sequent calculus and substitution (resp. elimination) in natural deduction. It is a combination of Parigot's λμ-calculus with the idea of "coercion calculus" due to Cervesato-Pfenning, accommodating let-expressions in a surprising way: they expand Parigot's syntactic class of named terms. This calculus aims to be the simultaneous answer to three problems. The first problem is the lack of a canonical natural deduction system for classical logic. \lml is not yet another classical calculus, but rather a canonical reflection in natural deduction of the impeccable treatment of classical logic by sequent calculus. The second problem is the lack of a formalization of the usual semantics of Curien-Herbelin's calculus, which explains co-terms and cuts as, respectively, contexts and hole-filling instructions. The mentioned isomorphism is the required formalization, based on the precise notions of context and hole-expression offered by \lml. The third problem is the lack of a robust process of "read-back" into natural deduction syntax of calculi in the sequent calculus format, which mainly affects the recent proof-theoretic efforts to derive λ-calculi for call-by-value. An isomorphic counterpart to the Q-subsystem of Curien-Herbelin's calculus is derived, obtaining a new λ-calculus for call-by-value, combining control and let-expressions. Fundação para a Ciência e a Tecnologia (FCT).

    Unknown Quantum States: The Quantum de Finetti Representation

    We present an elementary proof of the quantum de Finetti representation theorem, a quantum analogue of de Finetti's classical theorem on exchangeable probability assignments. This contrasts with the original proof of Hudson and Moody [Z. Wahrschein. verw. Geb. 33, 343 (1976)], which relies on advanced mathematics and does not share the same potential for generalization. The classical de Finetti theorem provides an operational definition of the concept of an unknown probability in Bayesian probability theory, where probabilities are taken to be degrees of belief instead of objective states of nature. The quantum de Finetti theorem, in a closely analogous fashion, deals with exchangeable density-operator assignments and provides an operational definition of the concept of an "unknown quantum state" in quantum-state tomography. This result is especially important for information-based interpretations of quantum mechanics, where quantum states, like probabilities, are taken to be states of knowledge rather than states of nature. We further demonstrate that the theorem fails for real Hilbert spaces and discuss the significance of this point. Comment: 30 pages, 2 figures.
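
The exchangeability at the heart of the theorem can be checked numerically: a two-system state of de Finetti form, i.e. a classical mixture of i.i.d. product states, is invariant under swapping the two subsystems. A small sketch with a hypothetical mixing distribution (not from the paper):

```python
import numpy as np

def ket(theta):
    """A real single-qubit state parameterised by an angle."""
    return np.array([np.cos(theta), np.sin(theta)])

# Mix product states |theta><theta| (x) |theta><theta| over a grid of angles,
# approximating the integral in the de Finetti representation.
thetas = np.linspace(0.0, np.pi, 50)
rho = np.zeros((4, 4))
for t in thetas:
    psi = np.kron(ket(t), ket(t))
    rho += np.outer(psi, psi)
rho /= len(thetas)

# The SWAP operator exchanges the two qubits; exchangeable states satisfy
# SWAP rho SWAP = rho.
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]])
print(np.allclose(SWAP @ rho @ SWAP, rho))  # True
```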

    Reciprocity as a foundation of financial economics

    This paper argues that what sustains the fundamental theorem of contemporary financial mathematics is the ethical concept of 'reciprocity'. The argument is based on identifying an equivalence between the contemporary, and ostensibly 'value neutral', Fundamental Theorem of Asset Pricing and theories of mathematical probability that emerged in the seventeenth century in the context of the ethical assessment of commercial contracts in a framework of Aristotelian ethics. This observation, the main claim of the paper, is justified on the basis of results from the Ultimatum Game and is analysed within a framework of Pragmatic philosophy. The analysis leads to the explanatory hypothesis that markets are centres of communicative action with reciprocity as a rule of discourse. The purpose of the paper is to reorientate financial economics to emphasise the objectives of cooperation and social cohesion, and to this end we offer specific policy advice.

    Less is Different: Emergence and Reduction Reconciled

    This is a companion to another paper. Together they rebut two widespread philosophical doctrines about emergence. The first, and main, doctrine is that emergence is incompatible with reduction. The second is that emergence is supervenience; or more exactly, supervenience without reduction. In the other paper, I develop these rebuttals in general terms, emphasising the second rebuttal. Here I discuss the situation in physics, emphasising the first rebuttal. I focus on limiting relations between theories and illustrate my claims with four examples, each of them a model or a framework for modelling, from well-established mathematics or physics. I take emergence as behaviour that is novel and robust relative to some comparison class. I take reduction as, essentially, deduction. The main idea of my first rebuttal is to perform the deduction after taking a limit of some parameter. Thus my first main claim is that in my four examples (and many others), we can deduce a novel and robust behaviour by taking the limit, as N goes to infinity, of a parameter N. On the other hand, this does not show that the infinite limit is "physically real", as some authors have alleged. For my second main claim is that in these same examples, there is a weaker, yet still vivid, novel and robust behaviour that occurs before we get to the limit, i.e. for finite N. And it is this weaker behaviour which is physically real. My examples are: the method of arbitrary functions (in probability theory); fractals (in geometry); superselection for infinite systems (in quantum theory); and phase transitions for infinite systems (in statistical mechanics). Comment: 75 pages.
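
The first of the four examples, the method of arbitrary functions, can be illustrated with a toy simulation (my own construction, assuming a wheel with N alternating red/black sectors; none of the numbers come from the paper). For any smooth initial density of spin speeds, P(red) approaches 1/2 as N grows, and the limiting behaviour is already well approximated at modest finite N:

```python
import random

random.seed(0)

def p_red(n_sectors, trials=100_000):
    """Estimate P(red) for a wheel with n_sectors alternating sectors,
    with spin speed drawn from an 'arbitrary' smooth density."""
    hits = 0
    for _ in range(trials):
        speed = random.betavariate(2, 5)     # one arbitrary choice of density
        sector = int(speed * n_sectors) % 2  # 0 = red, 1 = black
        hits += (sector == 0)
    return hits / trials

# Coarse wheels feel the initial density; fine wheels wash it out.
for n in (2, 8, 64):
    print(n, round(p_red(n), 3))
```

With only 2 sectors the skewed density pushes P(red) far from 1/2, while with 64 sectors the estimate is already close to 1/2, matching the paper's point that the novel, robust behaviour is visible before the infinite limit.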

    Henri Poincaré: The Status of Mechanical Explanations and the Foundations of Statistical Mechanics

    The first goal of this paper is to show the evolution of Poincaré's opinion on the mechanistic reduction of the principles of thermodynamics, placing it in the context of the science of his time. The second is to present some of his work in the 1890s on the foundations of statistical mechanics. He became interested first in thermodynamics and its relation with mechanics, drawing on the work of Helmholtz on monocyclic systems. After a period of skepticism concerning the kinetic theory, he read some of Maxwell's memoirs and contributed to the foundations of statistical mechanics. I also show that Poincaré's contributions to the foundations of statistical mechanics are closely linked to his work in celestial mechanics and to his interest in probability theory and its role in physics.